11 research outputs found

    Environmental control by remote eye tracking

    Eye movement interfacing can be found in some specially designed environmental control systems (ECSs) for people with severe disabilities. Typically the user sits in front of a computer monitor; their eye gaze direction is detected and used to control the cursor position on the screen. The ECS screen usually consists of a number of icons representing different controllable devices, and an eye fixation landing within a pre-defined icon area activates a selection for control. Such systems are widely used in homes, offices, schools, hospitals, and long-term care facilities. Wellings and Unsworth (1997) demonstrated that user-friendly interface design is the weak link in ECS technology, particularly for severely disabled people. Disabled individuals need straightforward control of their immediate surroundings, and making a detailed menu selection by techniques such as eye-screen interaction can be a difficult and tedious process for some individuals. This situation can be exacerbated by real-world issues such as eye-tracking systems that do not tolerate the user's head movement. This paper presents a different approach to environmental control using eye gaze selection, in which the control options applicable to a given device are automatically pre-selected by the user directly looking at the device in their environment. This intuitive method minimises the amount of navigation that the user must perform. To date, two main methods have been employed to achieve this direct eye-device control. The initial development using a head-mounted eye tracker was previously reported (Shi et al., 2006). This paper describes subsequent development of the system (Shi et al., 2007) using a remote eye tracker, which is simply situated in front of the user with no need for any attachment to them.
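    The dwell-based icon selection described in this abstract can be sketched as follows; the icon regions, gaze-sample rate and one-second dwell threshold are illustrative assumptions, not the published system's parameters.

```python
# Minimal sketch of dwell-time selection (illustrative; not the authors' code).
# An icon is "selected" once gaze fixates within its region for a dwell threshold.
from dataclasses import dataclass


@dataclass
class Icon:
    name: str
    x0: float  # screen-region bounds (left, top, right, bottom)
    y0: float
    x1: float
    y1: float

    def contains(self, x: float, y: float) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1


def select_by_dwell(gaze_samples, icons, dwell_s=1.0, sample_dt=0.02):
    """Return the first icon fixated continuously for `dwell_s` seconds,
    given (x, y) gaze samples taken every `sample_dt` seconds, else None."""
    needed = int(dwell_s / sample_dt)
    current, count = None, 0
    for x, y in gaze_samples:
        hit = next((ic for ic in icons if ic.contains(x, y)), None)
        if hit is not None and hit is current:
            count += 1
            if count >= needed:
                return hit.name  # dwell threshold reached: activate selection
        else:
            current, count = hit, (1 if hit is not None else 0)
    return None
```

A glance that merely passes over an icon never accumulates enough consecutive samples, which is how dwell selection distinguishes deliberate fixation from casual scanning.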

    Look in, turn on, drop out

    The ART (Attention-Responsive Technology) research project is developing a system to enable mobility-impaired individuals to access technology efficiently. The system monitors both the individual and any ICT devices in his/her environment. It then uses the individual's gaze direction to determine which ICT device, if any, they are potentially attending to. This information is relayed to a user-configurable control panel, which then displays only those controls that are appropriate, both to the user and to the particular device in question. The user can then choose to operate the device if s/he wishes. The initial ergonomic challenges in the development of the ART system are described.

    Helping people with ICT device control by eye gaze

    This paper presents a computer method to help people, typically those with limited mobility, to operate ICT devices with eye gaze in their living/work environment. The user's eye gaze is recorded and analysed in real time. Any ICT device in the environment that is looked at for a certain time period is identified, located and assumed to be the object of interest that the user wants to utilise. Through a suitable interface, the user can then decide whether to operate the device. By using this state-of-the-art technology, people with impaired mobility, or able-bodied people whose movements are restricted, can attain a more independent lifestyle.

    Eye-centric ICT control

    There are many ways of interfacing with ICT devices, but where the end user is mobility-restricted the interface designer has to become much more innovative. One approach is to employ the user's eye movements to initiate control operations, but this has the well-known problem that the measured eye gaze location does not always correspond to the user's actual visual attention. We have developed a methodology that overcomes these problems. The user's environment is imaged continuously and interrogated, using SIFT image analysis algorithms, for the presence of known ICT devices. The locations of these ICT devices are then related mathematically to the measured eye gaze location of the user. The technical development of the approach and its current status are described.

    Vision responsive technology to assist people with limited mobility

    Automation in the home or office environment plays an increasingly important part in people's lives. The control of such automated systems can be achieved by means as simple as pressing buttons on a remote control or as complex as using the user's speech. There are some situations where a user's mobility is limited due to disability, age or environmental hazards. Provided that the user can move his/her eyes properly, an approach that enables control by vision, i.e. eye selection, can be useful.

    X10 - are you looking at me?

    Various disabilities restrict the ease with which individuals can operate electronic and ICT devices. X10 is a system for home automation control and consequently lends itself to use by disabled individuals, particularly those with mobility restrictions, to control a wide range of devices, although the resultant user interface can be cumbersome. The development of an adequate user-centred interface/control that allows such an individual to operate multiple ICT devices easily is therefore a considerable challenge. The development of a technique that utilises a user's point of gaze to select a particular ICT device for subsequent operation, thereby simplifying the user interface, is described. All ICT devices in the environment are first digitally imaged from different angles to identify them to a computer imaging system; subsequently each device can be automatically recognised. The user's eye movements are recorded and their direction of gaze related in real time to the known 3D locations of the possible ICT devices, so enabling device selection prior to operation. The development of the technique and its current ongoing research status are described.
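    Relating the measured gaze direction to the known 3D locations of devices can be illustrated with a minimal sketch: pick the registered device whose position lies at the smallest angle from the gaze ray, within a tolerance. The eye position, gaze vector, device coordinates and 5-degree tolerance below are hypothetical assumptions, not the values of the system described.

```python
# Illustrative gaze-ray device selection (assumed geometry, not the X10 system's code).
import math


def _normalise(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)


def angle_to(gaze_dir, eye_pos, device_pos):
    """Angle (radians) between the gaze ray and the eye-to-device direction."""
    to_dev = _normalise(tuple(d - e for d, e in zip(device_pos, eye_pos)))
    dot = sum(a * b for a, b in zip(_normalise(gaze_dir), to_dev))
    return math.acos(max(-1.0, min(1.0, dot)))  # clamp against rounding error


def select_device(gaze_dir, eye_pos, devices, max_angle=math.radians(5)):
    """devices: {name: (x, y, z)}. Return the device closest in angle to the
    gaze ray if it falls within `max_angle`, else None."""
    name, pos = min(devices.items(),
                    key=lambda kv: angle_to(gaze_dir, eye_pos, kv[1]))
    return name if angle_to(gaze_dir, eye_pos, pos) <= max_angle else None
```

The angular tolerance absorbs small tracker errors while still rejecting a gaze that points at no registered device.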

    Exploring eye responsive control - from a head mounted to a remote system

    The Attention Responsive Technology (ART) system is designed to enable control of the environment by individuals for whom movement is difficult or undesirable. This paper reports further development of the ART system, replacing its initial head-mounted eye-tracking technology with a remotely mounted tracking system. The new system releases the user from the need to wear any head-mounted equipment, thus improving user comfort and acceptability. Instead, the eye-tracking cameras and the scene camera are situated in a fixed position a small distance from the user; these then track the user's eye gaze and field of view, respectively. This system would suit many situations in which the user remains seated, for example in a wheelchair or before a workstation onto which the cameras can be mounted.

    Direct gaze based environmental controls

    People at home increasingly enjoy the convenience brought by advanced technologies. With steadily increasing home automation applications, it is becoming more and more common for individuals to use one central control interface to set up and operate all the audio, video and other household appliances in a home. However, such interfaces are often too complicated for people with a disability to operate. Yet the technology has long been available to achieve Environmental Control (EC) for disabled people with limited mobility, helping them to live with more independence. This paper presents a specially designed EC system for use by people who have lost significant mobility but who have good control of their eye movements. Through attention responsive technology, a user is able to perform either simple or complex operations of any electrical household appliance by directly gazing at it.

    A new gaze-based interface for environmental control

    This paper describes a new control system interface that utilises the user's eye gaze to enable severely disabled individuals to control electronic devices easily. The system is based upon a novel human-computer interface, which facilitates simple control of electronic devices by predicting and responding to the user's possible intentions, based intuitively upon their point of gaze. The interface responds by automatically pre-selecting and offering only those controls appropriate to the specific device that the user looks at, in a simple and accessible manner. It therefore affords the user a conscious choice of the appropriate range of control actions, which may be executed by simple means and without the need to navigate manually through potentially complex control menus. Two systems, using a head-mounted and a remote eye tracker respectively, are introduced, compared and evaluated in this paper.

    SIFT approach matching constraints for real-time attention responsive system

    To enable mobility-impaired individuals to access technology more efficiently, an ART (Attention Responsive Technology) system is currently being developed, which aims to use a person's eye movements to enable simple control of devices in the room/office environment. One of the important parts of the system is recognising the objects around the user in three-dimensional (3D) space. This paper addresses a solution to the matching problems for 3D object recognition using Lowe's SIFT approach, by applying appropriate rules and constraints, in particular for objects with little surface detail.
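    The baseline matching constraint in Lowe's SIFT approach, the nearest-neighbour distance-ratio test, can be sketched in isolation as below. The 0.8 ratio and the toy two-dimensional descriptors are illustrative assumptions, and the paper's additional rules for low-texture objects are not reproduced here.

```python
# Sketch of Lowe's ratio-test matching constraint (illustrative parameters).
import math


def ratio_test_matches(query_desc, train_desc, ratio=0.8):
    """Accept a descriptor match only if the nearest neighbour is clearly
    better than the second nearest (distance ratio below `ratio`), which
    rejects ambiguous matches. Returns (query_index, train_index) pairs."""
    matches = []
    for qi, q in enumerate(query_desc):
        # Distances from this query descriptor to every candidate descriptor.
        dists = sorted((math.dist(q, t), ti) for ti, t in enumerate(train_desc))
        if len(dists) >= 2 and dists[0][0] < ratio * dists[1][0]:
            matches.append((qi, dists[0][1]))
    return matches
```

Discarding matches whose two nearest candidates are nearly equidistant is what makes the test effective: repetitive or featureless surfaces produce exactly such ambiguous neighbours, which is why objects with little surface detail need the further constraints the paper proposes.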